
    An Improved Algorithm for Generating Database Transactions from Relational Algebra Specifications

    Alloy is a lightweight modeling formalism based on relational algebra. In prior work with Fisler, Giannakopoulos, Krishnamurthi, and Yoo, we presented a tool, Alchemy, that compiles Alloy specifications into implementations that execute against persistent databases. The foundation of Alchemy is an algorithm for rewriting relational algebra formulas into code for database transactions. In this paper we report on recent progress in improving the robustness and efficiency of this transformation.
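    The core idea can be pictured as compiling a relational-algebra update into SQL statements that run inside a single transaction. The sketch below is a toy illustration of that idea, not Alchemy's actual algorithm; the AST classes (Rel, Tuples, Union) and the target schema are assumptions made for the example.

        # Toy sketch: compiling a relational-algebra update `r := r UNION s UNION {(3,4)}`
        # into SQL executed as one transaction. Not Alchemy's actual algorithm.
        import sqlite3
        from dataclasses import dataclass

        @dataclass
        class Rel:            # a named database relation
            name: str

        @dataclass
        class Tuples:         # an explicit finite set of tuples
            rows: list

        @dataclass
        class Union:
            left: object
            right: object

        def compile_update(target, expr):
            """Compile `target := target UNION expr` into (sql, args) pairs."""
            stmts = []
            if isinstance(expr, Rel):
                stmts.append((f"INSERT INTO {target} SELECT * FROM {expr.name}", []))
            elif isinstance(expr, Tuples):
                for row in expr.rows:
                    marks = ",".join("?" * len(row))
                    stmts.append((f"INSERT INTO {target} VALUES ({marks})", list(row)))
            elif isinstance(expr, Union):
                stmts += compile_update(target, expr.left)
                stmts += compile_update(target, expr.right)
            return stmts

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE r (a, b)")
        conn.execute("CREATE TABLE s (a, b)")
        conn.execute("INSERT INTO s VALUES (1, 2)")
        with conn:  # one transaction: all statements commit or roll back together
            for sql, args in compile_update("r", Union(Rel("s"), Tuples([(3, 4)]))):
                conn.execute(sql, args)
        print(conn.execute("SELECT * FROM r").fetchall())  # [(1, 2), (3, 4)]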

    Clinical and laboratory practice for lupus anticoagulant testing: an International Society on Thrombosis and Haemostasis Scientific and Standardization Committee survey

    Background: Current guidelines have contributed to more uniformity in the performance and interpretation of lupus anticoagulant (LA) testing. However, points to reconsider include testing for LA in patients on anticoagulation, cut-off values, and interpretation of results. Objectives: The aim of this International Society on Thrombosis and Haemostasis Scientific and Standardization Committee (ISTH SSC) questionnaire was to capture the spectrum of clinical and laboratory practice in LA detection, focusing on variability in practice, so that the responses could inform further ISTH SSC recommendations. Methods: Members of the ISTH SSC on Lupus Anticoagulant/Antiphospholipid Antibodies and participants of the Lupus Anticoagulant/Antiphospholipid Antibodies Programme of the External quality Control of diagnostic Assays and Tests Foundation were invited to complete a questionnaire on LA testing, hosted on the ISTH website using REDCap; responses were tallied using simple descriptive statistics. Results: There was good agreement on several key recommendations in the ISTH and other guidelines on LA testing, such as sample processing, principles of testing, choice of tests, repeat testing to confirm persistent positivity, and the use of interpretative reporting. However, the results highlight that there is less agreement on some other aspects, including the timing of testing in relation to thrombosis or pregnancy, testing in patients on anticoagulation, cut-off values, and the calculation and interpretation of results. Conclusions: Although some of the variability in LA testing practice reflects the lack of substantive data to underpin evidence-based recommendations, a more uniform approach, based on further guidance, should reduce the inter-center variability of LA testing.

    Verifying Temporal Regular Properties of Abstractions of Term Rewriting Systems

    Tree automaton completion is an algorithm used for proving safety properties of systems that can be modeled by a term rewriting system. This representation and verification technique works well for proving properties of infinite systems such as cryptographic protocols and, more recently, Java bytecode programs. The algorithm computes a tree automaton that represents a (regular) over-approximation of the set of terms reachable by rewriting from the initial terms. This approach is limited by the lack of information about the rewriting relation between terms: terms related by rewriting fall into the same equivalence class, since they are recognized by the same state of the tree automaton. Our objective is to produce an automaton embedding an abstraction of the rewriting relation sufficient to prove temporal properties of the term rewriting system. We propose to extend the algorithm to produce an automaton with more equivalence classes, distinguishing a term or a subterm from its successors w.r.t. rewriting. While ground transitions are used to recognize equivalence classes of terms, epsilon-transitions represent the rewriting relation between terms. From the completed automaton, it is possible to automatically build a Kripke structure abstracting the rewriting sequence. States of the Kripke structure are states of the tree automaton, and the transition relation is given by the set of epsilon-transitions. States of the Kripke structure are labelled by the set of terms recognized using ground transitions. On this Kripke structure, we define Regular Linear Temporal Logic (R-LTL) for expressing properties; such properties can then be checked using standard model-checking algorithms. The only difference between LTL and R-LTL is that predicates are replaced by regular sets of acceptable terms.
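    The construction in the last part of the abstract is mechanical once the completed automaton is given: Kripke states are automaton states, transitions come from the epsilon-transitions, and labels are the terms recognized through ground transitions. Below is a minimal sketch of that step with a made-up three-state automaton; the terms, states, and the "acceptable set" check are illustrative assumptions, not the paper's implementation.

        # Deriving a Kripke structure from a completed tree automaton (sketch).
        ground = {            # term -> recognizing state (assumed example)
            "f(a)": "q0",
            "f(b)": "q1",
            "b":    "q2",
        }
        epsilon = {("q0", "q1"), ("q1", "q1")}  # abstraction of rewriting steps

        states = set(ground.values())
        labels = {q: {t for t, s in ground.items() if s == q} for q in states}
        trans = {q: {b for (a, b) in epsilon if a == q} for q in states}

        # R-LTL-style check: every state recognizing f(a) only steps to
        # states whose labels lie in a regular set of acceptable terms.
        acceptable = {"f(b)"}
        for q in states:
            if "f(a)" in labels[q]:
                assert all(labels[q2] <= acceptable for q2 in trans[q])
        print(labels, trans)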

    Categorical Models for a Semantically Linear Lambda-calculus

    This paper presents a categorical approach to modeling a very simple Semantically Linear lambda calculus, named Sll-calculus. This is a core calculus underlying the programming language SlPCF. In particular, we introduce the notion of Sll-Category, which describes a very large class of sound models of Sll-calculus. Sll-Category extends, in a natural way, Benton, Bierman, Hyland and de Paiva's Linear Category, in order to soundly interpret all the constructs of Sll-calculus. The category is general enough to capture interesting models in Scott domains and coherence spaces.

    A post-glacial sea level hinge on the central Pacific coast of Canada

    Post-glacial sea level dynamics during the last 15,000 calendar years are highly variable along the Pacific coast of Canada. During the Last Glacial Maximum, the Earth's crust was depressed by ice loading along the mainland inner coast and relative sea levels were as much as 200 m higher than today. In contrast, some outer coastal areas experienced a glacial forebulge (uplift) effect that caused relative sea levels to drop to as much as 150 m below present levels. Between these inner and outer coasts, we hypothesize that there would have been an area where sea level remained relatively stable, despite regional and global trends in sea level change. To address this hypothesis, we use pond basin coring, diatom analysis, archaeological site testing, sedimentary exposure sampling, and radiocarbon dating to construct sea level histories for the Hakai Passage region. Our data include 106 newly reported radiocarbon ages from key coastal sites that together support the thesis that this area has experienced a relatively stable sea level over the last 15,000 calendar years. These findings are significant in that they indicate a relatively stable coastal environment amenable to long-term human occupation and settlement of the area. Our results will help inform future archaeological investigations in the region.

    Resource-Bound Quantification for Graph Transformation

    Graph transformation has been used to model concurrent systems in software engineering, as well as in biochemistry and the life sciences. The application of a transformation rule can be characterised algebraically as the construction of a double-pushout (DPO) diagram in the category of graphs. We show how intuitionistic linear logic can be extended with resource-bound quantification, allowing for an implicit handling of the DPO conditions, and how this resource logic can be used to reason about graph transformation systems.
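    For readers unfamiliar with DPO rewriting, the operational content is small: applying a rule L <- K -> R at a match deletes the image of L \ K and glues in a copy of R \ K. The following set-based sketch applies a rule at an identity match and checks only the dangling condition; it illustrates the DPO step itself, not the paper's resource-bound logic, and all names are assumptions made for the example.

        # Set-based sketch of one DPO rewriting step (identity match).
        def apply_dpo(host_nodes, host_edges, L, K, R):
            """L, K, R are (nodes, edges) pairs sharing the interface K."""
            (Ln, Le), (Kn, Ke), (Rn, Re) = L, K, R
            deleted = Ln - Kn
            # Dangling condition: deleted nodes must not touch edges outside L.
            for (u, v) in host_edges - Le:
                if u in deleted or v in deleted:
                    raise ValueError("dangling condition violated")
            nodes = (host_nodes - deleted) | (Rn - Kn)      # pushout complement,
            edges = (host_edges - (Le - Ke)) | (Re - Ke)    # then pushout
            return nodes, edges

        # Rule: replace the edge (1, 2) by a path through a fresh node 3.
        L = ({1, 2}, {(1, 2)})
        K = ({1, 2}, set())
        R = ({1, 2, 3}, {(1, 3), (3, 2)})
        print(apply_dpo({1, 2, 9}, {(1, 2), (2, 9)}, L, K, R))
        # -> ({1, 2, 3, 9}, {(1, 3), (3, 2), (2, 9)})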

    Inhibitory Control Deficits Associated with Upregulation of CB1R in the HIV-1 Tat Transgenic Mouse Model of HAND

    In the era of combined antiretroviral therapy, HIV-1-infected individuals are living longer lives; however, longevity is met with an increasing number of HIV-1-associated neurocognitive disorders (HAND) diagnoses. The transactivator of transcription (Tat) is known to mediate the neurotoxic effects in HAND by acting directly on neurons and also indirectly via its actions on glia. The Go/No-Go (GNG) task was used to examine HAND in the Tat transgenic mouse model. The GNG task requires subjects to discriminate between two stimulus sets in order to determine whether or not to inhibit a previously trained response. Data reveal inhibitory control deficits in female Tat(+) mice (p = .048) and an upregulation of cannabinoid type 1 receptors (CB1R) in the infralimbic (IL) cortex in the same female Tat(+) group (p < .05). A significant negative correlation was noted between inhibitory control and IL CB1R expression (r = -.543, p = .045), with CB1R expression predicting 30% of the variance in inhibitory control (R(2) = .295, p = .045). Furthermore, there was a significant increase in spontaneous excitatory postsynaptic current (sEPSC) frequencies in Tat(+) compared to Tat(-) mice (p = .008, across sexes). The increase in sEPSC frequency was significantly attenuated by bath application of PF3845, a fatty acid amide hydrolase (FAAH) enzyme inhibitor (p < .001). Overall, the GNG task is a viable measure of inhibitory control deficits in Tat transgenic mice, and the results suggest a potential therapeutic treatment for the observed deficits with drugs that modulate endocannabinoid enzyme activity. Graphical Abstract: Results of the Go/No-Go operant conditioning task reveal inhibitory control deficits in female transgenic Tat(+) mice without significantly affecting males. The demonstrated inhibitory control deficits appear to be associated with an upregulation of cannabinoid type 1 receptors (CB1R) in the infralimbic (IL) cortex in the same female Tat(+) group.
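    The link between the reported correlation and the variance explained is the usual single-predictor identity R^2 = r^2: (-0.543)^2 is approximately 0.295, i.e. about 30%. The snippet below replays that arithmetic with a hand-rolled Pearson r; the two arrays are placeholder values, not the study's data.

        import math

        def pearson_r(xs, ys):
            """Pearson correlation coefficient of two equal-length samples."""
            n = len(xs)
            mx, my = sum(xs) / n, sum(ys) / n
            cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
            sy = math.sqrt(sum((y - my) ** 2 for y in ys))
            return cov / (sx * sy)

        cb1r  = [1.0, 1.4, 1.1, 1.9, 1.6]   # hypothetical IL CB1R expression
        inhib = [0.9, 0.6, 0.8, 0.3, 0.5]   # hypothetical inhibitory-control score
        r = pearson_r(cb1r, inhib)
        print(f"placeholder data: r = {r:.3f}, R^2 = {r * r:.3f}")
        print(f"reported: r = -0.543 -> R^2 = {(-0.543) ** 2:.3f}")  # 0.295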

    Object-oriented Programming Laws for Annotated Java Programs

    Object-oriented programming laws have been proposed in the context of languages that are not combined with a behavioral interface specification language (BISL). The strong dependence between source code and interface specifications may cause a number of difficulties when transforming programs. In this paper we introduce a set of programming laws for object-oriented languages like Java combined with the Java Modeling Language (JML). The laws deal with object-oriented features while taking their specifications into account; some laws deal only with features of the specification language. These laws constitute a set of small transformations that support the development of more elaborate ones, such as refactorings.
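    As a concrete (if drastically simplified) picture of what such a law looks like, the sketch below models classes as dictionaries and implements one law, "move method to superclass", whose provisos must hold before the transformation fires; the JML-style annotation travels with the method body. The class layout, the law, and its side conditions are illustrative assumptions, not the paper's formal definitions.

        def move_method_up(classes, sub, method):
            """Apply the law only when its provisos hold."""
            sup = classes[sub]["extends"]
            body, spec = classes[sub]["methods"][method]
            # Provisos (simplified): the superclass must not already declare
            # the method, and the body must not use subclass-only fields.
            assert method not in classes[sup]["methods"]
            sub_only = classes[sub]["fields"] - classes[sup]["fields"]
            assert not any(f in body for f in sub_only)
            classes[sup]["methods"][method] = (body, spec)  # spec moves with code
            del classes[sub]["methods"][method]

        classes = {
            "Account": {"extends": None, "fields": {"balance"}, "methods": {}},
            "Savings": {"extends": "Account", "fields": {"balance", "rate"},
                        "methods": {"deposit": ("balance += v;",
                                    "//@ ensures balance >= \\old(balance);")}},
        }
        move_method_up(classes, "Savings", "deposit")
        print(list(classes["Account"]["methods"]))  # ['deposit']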

    On Linear Information Systems

    Scott's information systems provide a categorically equivalent, intensional description of Scott domains and continuous functions. Following a well-established pattern in denotational semantics, we define a linear version of information systems, providing a model of intuitionistic linear logic (a new-Seely category), with a "set-theoretic" interpretation of exponentials that recovers Scott-continuous functions via the co-Kleisli construction. From a domain-theoretic point of view, linear information systems are equivalent to prime algebraic Scott domains, which in turn generalize prime algebraic lattices, already known to provide a model of classical linear logic.
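    To fix intuitions, a classical (non-linear) Scott information system is just a set of tokens, a consistency predicate on finite token sets, and an entailment relation; its points are the consistent, entailment-closed token sets. The sketch below enumerates the points of a three-token toy system; the system itself is an assumed example, and the linear refinement studied in the paper is not modelled here.

        from itertools import chain, combinations

        tokens = {"a", "b", "ab"}            # "ab" behaves like the join of a and b
        consistent = lambda s: True          # every finite set is consistent here
        entails = {("ab",): {"a", "b"}}      # {ab} |- a and {ab} |- b

        def close(s):
            """Entailment-closure of a token set."""
            s = set(s)
            changed = True
            while changed:
                changed = False
                for premise, concl in entails.items():
                    if set(premise) <= s and not concl <= s:
                        s |= concl
                        changed = True
            return frozenset(s)

        # Enumerate the finite points: closures of consistent subsets.
        every = chain.from_iterable(
            combinations(tokens, k) for k in range(len(tokens) + 1))
        points = {close(s) for s in every if consistent(s)}
        for p in sorted(points, key=len):
            print(sorted(p))   # {}, {a}, {b}, {a, b}, {a, b, ab}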